LAB 7 - Physical Computing, Tangible Bits, and Different Prototype Methods for Authoring Devices
Define 'physical computing':
Physical computing consists of sensing and controlling the physical world through sensors that respond to touch, sound, and so on. It enables individuals to make invisible things visible. For example, Hiroshi Ishii created a sensor-equipped bottle that communicates what is happening around the world, weather-wise. This becomes a more tangible representation of data (weather), for it takes a physical form and communicates the unexpected and invisible.
In addition, physical computing allows an individual to look at data and understand it in a way they never have before. Consider Bill Gaver's experimental design that detects a person's mood through sensors. The sensors sit in a platform on a table, or wherever keys and mobile phones are usually dropped on the way into the house. The platform infers mood by sensing how hard the person places or throws the phone onto it. The next person who enters the home can then see that someone is in a bad mood and handle the situation accordingly to avoid further conflict. This experiment emphasizes the idea of seeing what is unseen, because normally no one can sense another person's mood just by walking into the house and glancing at a display.
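The mood-table idea can be sketched in a few lines. This is an illustrative assumption, not Gaver's actual design: we pretend the platform reports a peak impact value when an object lands on it, and we invent the scale and the cutoffs that map impact to mood.

```python
# Hypothetical sketch of the mood-sensing platform.
# The sensor, the impact units, and the cutoff values are all
# invented for illustration; the source describes only the concept.

def mood_from_impact(peak_impact):
    """Map the peak force of a phone being set down to a coarse mood."""
    if peak_impact > 8.0:      # thrown down hard
        return "agitated"
    if peak_impact > 3.0:      # placed firmly
        return "neutral"
    return "calm"              # set down gently

readings = [1.2, 5.0, 9.7]
moods = [mood_from_impact(r) for r in readings]
```

The design choice here mirrors the write-up: the platform does not show raw numbers, it shows a simple interpretation that another household member can read at a glance.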
Physical computing also shows visually how humans communicate through computers. In the gesture-based example, an MIT student builds a prototype that moves computer functions onto the hands, with sensor markers attached to the forefingers. This equipment lets the student interact with the outside world almost as a computer himself: he takes pictures by framing a scene with his fingers, reads reviews of books, and has books read to him through his fingers. If he doesn't have a watch, just drawing the outline of a watch on his wrist makes the time appear. Even while shaking another individual's hand, the student can pull up information about that person that he would not otherwise know unless told. All these examples make invisible data visible for the student.
Joy Mountford's team also explored and developed the QuickTime application so the computer could include images and videos. They built a prototype that combined still images and turned them into video, which enhanced the abilities of the personal computer and moved it toward multimedia. The QuickTime prototype also allows visual representations to move, come to life, and play.
Dynamic Experience Prototypes d.tools: Reflective Physical Prototyping
The video is about prototyping with d.tools, which lets a person design, test, and analyze technological devices. Sam, an interaction designer, connects a small LCD screen for display and an accelerometer in order to create a prototype of a handheld GPS device driven by orientation sensors. The setup lets Sam author interactions between the accelerometer and the display screen, so he can visualize what the program is doing at any moment: if he tilts the accelerometer up or down, the representation on the screen shows him exactly what he is doing and how it works.
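The tilt-to-display loop Sam builds can be sketched as follows. This is a minimal simulation, not the actual d.tools API: `classify_tilt`, `render`, the threshold, and the sample values are all assumptions standing in for a real accelerometer and LCD.

```python
# Illustrative sketch of the accelerometer-to-display mapping.
# Function names, the 0.3 threshold, and the sample stream are
# invented; d.tools itself wires this up visually, not in code.

def classify_tilt(y_accel, threshold=0.3):
    """Map a raw vertical-axis reading to a coarse orientation."""
    if y_accel > threshold:
        return "tilt up"
    if y_accel < -threshold:
        return "tilt down"
    return "level"

def render(orientation):
    """Stand-in for the LCD: return the text the screen would show."""
    return f"GPS view: {orientation}"

# Simulated stream of accelerometer samples
samples = [0.0, 0.5, -0.6]
frames = [render(classify_tilt(s)) for s in samples]
```

Running the simulated stream produces one display frame per sensor sample, which is the immediate feedback loop the video emphasizes.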
In this workflow the designer tests the device's program with the tools on his desktop in order to see whether the new device will work. Doing a visual mapping of the program allows the designer to check the logic through interaction designs before the device is built. This is essentially troubleshooting an item before money is spent on making it and only then discovering whether it actually works.
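The "visual mapping" that d.tools checks is essentially a state machine of screens and events, which can be tested in software before any hardware exists. The sketch below is an assumption in that spirit: the screen names and event names are invented, and the transition table plays the role of the visual diagram.

```python
# Minimal state-machine sketch of checking an interaction design
# before building hardware. Screens ("map", "zoom_in", "zoom_out")
# and events ("tilt_up", "tilt_down") are illustrative assumptions.

transitions = {
    ("map", "tilt_up"): "zoom_in",
    ("map", "tilt_down"): "zoom_out",
    ("zoom_in", "tilt_down"): "map",
    ("zoom_out", "tilt_up"): "map",
}

def run(events, start="map"):
    """Replay a sequence of sensor events and record visited screens."""
    state = start
    trace = [state]
    for e in events:
        # Undefined (state, event) pairs leave the screen unchanged.
        state = transitions.get((state, e), state)
        trace.append(state)
    return trace

trace = run(["tilt_up", "tilt_down", "tilt_down"])
```

Replaying event sequences like this is a cheap way to catch a dead end in the interaction design before a single part is ordered.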
Exemplar: Authoring Sensor-based Interactions by Demonstration
Exemplar prototypes interactions that involve sensors: the designer demonstrates an action or movement, reviews and edits a visual representation of it, and the tool produces program behavior that matches the demonstration. Dan, an interaction designer, uses it to prototype a bicycle helmet with left and right turn-signal lights on it. Tilting the head left or right turns on the light for that direction. A remote with buttons is also present for signaling a left or right turn, and this remote can be switched on or off depending on what the cyclist prefers.
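The helmet behavior reduces to a threshold test on head roll. The sketch below is a hedged illustration, not Dan's actual program: the roll units (degrees), the 20-degree threshold, and the function name are assumptions.

```python
# Illustrative sketch of the helmet's turn-signal logic.
# A head tilt past the threshold lights the matching signal;
# units and the threshold value are invented for illustration.

def signal_from_tilt(roll_degrees, threshold=20.0):
    """Return which turn signal (if any) a head tilt should light."""
    if roll_degrees <= -threshold:
        return "left"
    if roll_degrees >= threshold:
        return "right"
    return None  # head roughly level: no signal

signals = [signal_from_tilt(r) for r in (-25.0, 5.0, 30.0)]
```

The remote in the video would simply bypass this function and set the signal directly, which is why the two input methods can coexist.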
An advanced feature of Exemplar is that it generalizes thresholds: it controls exactly how far the head must tilt before a sensor turns the signal on. Using Exemplar, a user can design their own sensor-based interactions and simultaneously watch the visual representations of the sensor data. This makes prototyping much easier and less expensive.
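One way such threshold generalization could work is to record the sensor while the designer performs the gesture a few times, then set the trigger level just below the weakest demonstrated peak. This min-with-margin rule is an assumption for illustration; the source only says that Exemplar generalizes thresholds from demonstrations.

```python
# Hedged sketch of learning a trigger threshold from a demonstration.
# The 0.9 safety margin and the min-based rule are assumptions, not
# Exemplar's actual algorithm.

def learn_threshold(demo_peaks, margin=0.9):
    """Set the trigger just below the weakest demonstrated peak."""
    return min(demo_peaks) * margin

def triggered(sample, threshold):
    """True once a live sensor reading crosses the learned level."""
    return sample >= threshold

demo = [32.0, 28.5, 30.2]          # tilt peaks recorded during the demo
threshold = learn_threshold(demo)  # 28.5 * 0.9 = 25.65
```

Deriving the number from the demonstration, instead of asking the designer to type one in, is exactly what makes the demonstrate-review-edit loop faster than hand-coding.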